Approximate posterior distributions for convolutional two-level hidden Markov models
Authors
Abstract
A convolutional two-level hidden Markov model is defined and evaluated. The bottom level contains an unobserved categorical Markov chain, and, given the variables in this level, the middle level contains unobserved conditionally independent Gaussian variables. The top level contains observable variables that are a convolution of the variables in the middle level plus additive Gaussian errors. The objective of the study is to assess the categorical variables in the bottom level given the convolved observations in the top level. The inversion is cast in a Bayesian setting with a Markov chain prior model and a convolved Gaussian likelihood model. The associated posterior model cannot be assessed exactly, since the normalizing constant is too computationally demanding to calculate for realistic problems. Three approximate posterior models, based on approximations of the likelihood model on generalized factorial form, are defined. These approximations can be assessed exactly by the forward-backward algorithm. Both a synthetic case and a real seismic inversion case are used in an empirical evaluation. It is concluded that reliable and computationally efficient approximate posterior models for convolutional two-level hidden Markov models can be defined.
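The sketch below (Python/NumPy, not the authors' code, with illustrative parameter values) simulates the three-level generative model described above and then runs a forward-backward pass under a deliberately crude pointwise likelihood approximation; this approximation is simpler than the three approximations defined in the paper and is included only to show why a likelihood on factorial form makes the recursion exact.

```python
# Minimal sketch of the convolutional two-level hidden Markov model.
# All parameter values (P, mu, sigma, w, tau) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n, K = 200, 2                                   # chain length, number of classes
P = np.array([[0.95, 0.05],                     # assumed transition matrix
              [0.10, 0.90]])
mu = np.array([0.0, 2.0])                       # class-wise Gaussian means
sigma = np.array([0.5, 0.5])                    # class-wise Gaussian std devs
w = np.array([0.25, 0.5, 1.0, 0.5, 0.25])       # assumed convolution kernel
tau = 0.3                                       # observation noise std dev

# Bottom level: unobserved categorical Markov chain x.
x = np.empty(n, dtype=int)
x[0] = rng.integers(K)
for t in range(1, n):
    x[t] = rng.choice(K, p=P[x[t - 1]])

# Middle level: conditionally independent Gaussians given x.
z = rng.normal(mu[x], sigma[x])

# Top level: observed convolution of z plus additive Gaussian errors.
d = np.convolve(z, w, mode="same") + rng.normal(0.0, tau, n)

# Crude factorial approximation of the convolved likelihood: pretend d_t
# depends on x_t alone, with moments matched to the convolved model.
var_k = (w ** 2).sum() * sigma ** 2 + tau ** 2
lik = np.exp(-0.5 * (d[:, None] - w.sum() * mu) ** 2 / var_k) / np.sqrt(2 * np.pi * var_k)

# Forward-backward recursion; exact for any likelihood on factorial form.
alpha = np.empty((n, K))
beta = np.empty((n, K))
alpha[0] = lik[0] / K
alpha[0] /= alpha[0].sum()
for t in range(1, n):
    alpha[t] = lik[t] * (alpha[t - 1] @ P)
    alpha[t] /= alpha[t].sum()
beta[-1] = 1.0
for t in range(n - 2, -1, -1):
    beta[t] = P @ (lik[t + 1] * beta[t + 1])
    beta[t] /= beta[t].sum()

post = alpha * beta
post /= post.sum(axis=1, keepdims=True)         # approximate P(x_t = k | d)
```

The exact posterior is not on this factorial form because the convolution couples neighbouring middle-level variables; the approximations defined in the paper replace the naive moment match used here with better-behaved factorial likelihood approximations while keeping the forward-backward machinery.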
Similar articles
Speech enhancement based on hidden Markov model using sparse code shrinkage
This paper presents a new hidden Markov model-based (HMM-based) speech enhancement framework built on independent component analysis (ICA). We propose analytical procedures for training clean-speech and noise models by the Baum re-estimation algorithm and present a maximum a posteriori (MAP) estimator based on a Laplace-Gaussian combination (for clean speech and noise, respectively) in the HMM ...
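As a hypothetical illustration of the kind of Laplace-Gaussian MAP shrinkage rule such a framework relies on (not the paper's actual estimator), the sketch below soft-thresholds a noisy coefficient y = s + n, where the clean component s is Laplace(0, b) and the noise n is Gaussian with standard deviation sigma; all numeric values are illustrative.

```python
# Hypothetical sketch: the MAP estimate of a Laplacian (sparse) signal observed
# in Gaussian noise reduces to soft-thresholding at sigma^2 / b.
import numpy as np

def map_shrink(y, b, sigma):
    """Laplace-Gaussian MAP rule: soft-threshold y at sigma**2 / b."""
    thresh = sigma ** 2 / b
    return np.sign(y) * np.maximum(np.abs(y) - thresh, 0.0)

# Example on a few noisy coefficients (illustrative values).
y = np.array([-1.2, 0.05, 0.8, -0.02, 2.3])
print(map_shrink(y, b=0.5, sigma=0.3))
```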
Bayesian Learning in Undirected Graphical Models: Approximate MCMC Algorithms
Bayesian learning in undirected graphical models (computing posterior distributions over parameters and predictive quantities) is exceptionally difficult. We conjecture that for general undirected models, there are no tractable MCMC (Markov Chain Monte Carlo) schemes giving the correct equilibrium distribution over parameters. While this intractability, due to the partition function, is familiar...
Introducing Busy Customer Portfolio Using Hidden Markov Model
Markov models play an effective role in customer relationship management (CRM), yet a comprehensive literature review containing all related studies is lacking. In this paper, the focus is on academic databases to find all articles published in 2011 and earlier. One hundred articles were identified and reviewed to find direct relevance for applying Markov models...
A Two-Stage Pretraining Algorithm for Deep Boltzmann Machines
A deep Boltzmann machine (DBM) is a recently introduced Markov random field model that has multiple layers of hidden units. It has been shown empirically that it is difficult to train a DBM with approximate maximum-likelihood learning using the stochastic gradient, unlike its simpler special case, the restricted Boltzmann machine (RBM). In this paper, we propose a novel pretraining algorithm that con...
Image Segmentation using Gaussian Mixture Model
Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models have a key role in probabilistic data analysis. In this paper, we fitted a Gaussian mixture model to the pixels of an image. The parameters of the model were estimated by the EM algorithm. In addition, the pixel labeling corresponding to each pixel of the true image was made by the Bayes rule. In fact,...
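A minimal sketch of the pipeline described in that snippet, assuming scikit-learn's GaussianMixture for the EM step (the paper's own implementation is not shown): fit the mixture to pixel intensities and label each pixel by its maximum posterior component, i.e. the Bayes rule.

```python
# Illustrative sketch, not the paper's code: EM-fitted Gaussian mixture on
# pixel intensities, then per-pixel labels by maximum posterior probability.
import numpy as np
from sklearn.mixture import GaussianMixture   # assumed dependency

def segment(image, n_classes=3):
    pixels = image.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=n_classes, random_state=0).fit(pixels)
    return gmm.predict(pixels).reshape(image.shape)   # argmax-posterior labels

# Usage on a synthetic two-region image with additive noise.
rng = np.random.default_rng(0)
img = np.vstack([np.full((32, 64), 50.0), np.full((32, 64), 200.0)])
img += rng.normal(0, 10, img.shape)
labels = segment(img, n_classes=2)
```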
Journal: Computational Statistics & Data Analysis
Volume 58, Issue -
Pages -
Publication date 2013